Social License
Why a Social License is Needed for AI
If business wants to use AI at scale, adhering to the technical guidelines for responsible AI development isn't enough. It must also obtain society's explicit approval to deploy the technology.

Six years ago, in March 2016, Microsoft Corporation launched an experimental AI-based chatbot, TayTweets, under the Twitter handle @TayandYou. Tay, an acronym for "thinking about you," mimicked a 19-year-old American girl online so that the digital giant could showcase how quickly AI can learn when it interacts with human beings. Living up to its billing as "AI with zero chill," Tay started off replying cheekily to Twitter users and turning photographs into memes. Some topics were off limits, though; Microsoft had trained Tay not to comment on societal issues such as Black Lives Matter. Soon enough, a group of Twitter users targeted Tay with a barrage of tweets about controversial issues such as the Holocaust and Gamergate. Exploiting its repeat-after-me capability, they goaded the chatbot into racist and sexually charged responses. Realizing that Tay was behaving like IBM's Watson, which had started using profanity after perusing the online Urban Dictionary, Microsoft quickly deleted the first inflammatory tweets. Less than 16 hours and more than 100,000 tweets later, the digital giant shut Tay down.
Does Artificial Intelligence (AI) Need a Social Licence?
According to the Boston Consulting Group (BCG), companies have no option but to acquire a social license for AI. When Mary Shelley wrote Frankenstein in 1818, she was writing about technology: Dr. Victor Frankenstein creates a man who becomes a monster, leaping beyond his creator's expectations and terrifying the townspeople until his creator shuts him down. One wonders: given the right circumstances, might Frankenstein's story have ended in triumph? Reading BCG's article "Why AI Needs a Social License" reminded me of Shelley's classic.
AI-at-Scale Hinges on Gaining a 'Social License'
In January 2020, a little-known American facial recognition company, Clearview AI, was thrust into the limelight. It had quietly flown under the radar until The New York Times reported that businesses, law enforcement agencies, universities, and individuals had been purchasing its sophisticated facial recognition software, whose algorithm could match human faces against a database of more than 3 billion images the company had collected from the internet. The article renewed the global debate about the use of AI-based facial recognition technology by governments and law enforcement agencies. Many people called for a ban on Clearview AI's technology because the startup had built its database by mining social media websites and the wider internet for photographs without obtaining permission to index individuals' faces. Twitter almost immediately sent the company a cease-and-desist letter, and YouTube and Facebook followed suit.
Why talking about ethics is not enough: a proposal for Fintech's AI ethics
de Oliveira, Cristina Godoy Bernardo; Ruiz, Evandro Eduardo Seron
As the potential applications of Artificial Intelligence (AI) in the financial sector increase, ethical issues become increasingly salient. Distrust among individuals, social groups, and governments about the risks arising from Fintechs' activities is growing. In response, recommendations and ethics guidelines are proliferating, and there is a high risk that companies will simply choose the principles and ethical values most convenient to them. This exploratory research therefore analyzes the benefits of applying stakeholder theory and the idea of a Social License to build an environment of trust and to realize ethical principles in Fintech. Forming a Fintech association to create a Social License would allow early-stage Fintechs to participate, from the beginning of their activities, in the elaboration of a dynamic ethical code developed with the participation of stakeholders.
Artificial intelligence impact on society
Three friends were having morning tea on a farm in the Northern Rivers region of New South Wales (NSW), Australia, when they noticed a drilling rig setting up on a neighbor's property on the opposite side of the valley. They had never heard of the coal seam gas (CSG) industry, nor had they previously considered activism. That drilling rig, however, was enough to push them into action. The group soon became instrumental in establishing the anti-CSG movement, whose activism led the NSW government to suspend gas exploration licenses in the area in 2014. By 2015, the government had bought back a petroleum exploration license covering 500,000 hectares across the region.

Mining companies, like companies in many industries, have been struggling with the difference between having a legal license to operate and a moral one. The colloquial version of this is the distinction between what one could do and what one should do: just because something is technically possible and economically feasible doesn't mean that the people it affects will find it morally acceptable. Without the acceptance of the community, firms find themselves dealing with "never-ending demands" from "local troublemakers" hearing that "the company has done nothing for us," all resulting in costs, financial and nonfinancial, that weigh projects down. A company can have the best intentions, investing in (what it thought were) all the right things, and still meet opposition from within the community. It may work to understand local mores and invest in the community's social infrastructure, improving access to health care and education, upgrading roads and electricity services, and fostering economic activity so that local businesses bustle and employment stays healthy, all to no avail. Without the community's acceptance, without a moral license, the mining companies in NSW found themselves struggling.

This moral license is commonly called a social license, a phrase coined in the 1990s to describe a local community's ongoing acceptance and approval of a mining development. Since then, the mining industry has increasingly recognized that firms must work with local communities to obtain, and then maintain, a social license to operate (SLO). The concept has developed over time and been adopted by a range of industries that affect the physical environment they operate in, such as logging and pulp and paper mills. What has any of this to do with artificial intelligence (AI)?